Mining Matrix Data with Bregman Matrix Divergences for Portfolio Selection

Authors

  • Richard Nock
  • Brice Magdalou
  • Eric Briys
  • Frank Nielsen
Abstract

If only we always knew ahead of time... The dream of any stock portfolio manager is to allocate stocks in his portfolio in hindsight so as to always reach maximum wealth. With hindsight, over a given time period, the best strategy is to invest in the best performing stock over that period. However, even this appealing strategy is not without regret. Notwithstanding the appeal of reallocating every day to the best stock in hindsight (that is, with a perfect sense of timing for ups and downs), Cover has shown that a Constant Rebalancing Portfolio (CRP) strategy can deliver superior results [10]. These superior portfolios have been named Universal Portfolios (UP). In other words, if one follows Cover's advice, a non-anticipating portfolio allocation performs (asymptotically) as well as the best constant rebalancing portfolio allocation determined in hindsight. This UP allocation is, however, not costless, as it replicates the payoff, if it existed, of an exotic option, namely a hindsight allocation option. Buying this option, if it were traded, would enable a fund manager to behave as if he always knew everything in hindsight. Finding useful portfolio allocations, like the CRP allocation, is not, however, always related to the desire to outperform some pre-agreed benchmark. As Markowitz has shown, investors know that they cannot achieve stock returns greater than the risk-free rate without having to carry some risk [17]. Markowitz designed a decision criterion which, taking both risk and return into account, enables any investor to compute the weights of each individual stock in his preferred portfolio. The investor is assumed to like return but to dislike risk: this is the much celebrated mean-variance...
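The CRP idea described above can be made concrete with a small sketch: a constant rebalanced portfolio applies the same weight vector every day, so its daily growth factor is the weighted sum of that day's price relatives, and its terminal wealth is the product of those factors. The price data and weights below are hypothetical illustrations, not from the paper.

```python
import numpy as np

# Hypothetical price-relative data: rows are trading days, columns are stocks.
# Entry (t, i) is closing price / opening price of stock i on day t.
price_relatives = np.array([
    [1.05, 0.98],
    [0.97, 1.04],
    [1.02, 1.01],
    [0.99, 1.03],
])

def crp_wealth(price_relatives, weights):
    """Terminal wealth of a Constant Rebalanced Portfolio: the portfolio is
    rebalanced back to the fixed weight vector every day, so each day's growth
    factor is the weighted sum of that day's price relatives."""
    daily_growth = price_relatives @ weights
    return float(np.prod(daily_growth))

# Best single stock in hindsight: buy-and-hold wealth of each stock, take the max.
best_stock = float(np.prod(price_relatives, axis=0).max())

# Uniform CRP as one example allocation (not necessarily the best CRP).
uniform = crp_wealth(price_relatives, np.array([0.5, 0.5]))
print(best_stock, uniform)
```

Cover's result concerns the best CRP weight vector chosen in hindsight, which in general differs from the uniform allocation used here for illustration.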


Similar Articles

Matrix Nearness Problems with Bregman Divergences

This paper discusses a new class of matrix nearness problems that measure approximation error using a directed distance measure called a Bregman divergence. Bregman divergences offer an important generalization of the squared Frobenius norm and relative entropy, and they all share fundamental geometric properties. In addition, these divergences are intimately connected with exponential families...
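To make the generalization mentioned above concrete, here is a minimal sketch (assuming symmetric positive-definite matrices, and using only NumPy) of two matrix Bregman divergences: the half squared Frobenius norm, generated by φ(X) = ‖X‖²_F / 2, and the von Neumann divergence, generated by the quantum entropy φ(X) = tr(X log X − X). This is an illustration of the general notion, not the specific algorithms of any paper listed here.

```python
import numpy as np

def matrix_log_spd(X):
    """Matrix logarithm of a symmetric positive-definite matrix,
    computed through its eigendecomposition."""
    w, V = np.linalg.eigh(X)
    return V @ np.diag(np.log(w)) @ V.T

def von_neumann_divergence(X, Y):
    """Bregman divergence generated by phi(X) = tr(X log X - X):
    D(X, Y) = tr(X log X - X log Y - X + Y), for SPD X and Y."""
    return float(np.trace(X @ matrix_log_spd(X) - X @ matrix_log_spd(Y) - X + Y))

def frobenius_divergence(X, Y):
    """Bregman divergence generated by phi(X) = ||X||_F^2 / 2,
    which reduces to half the squared Frobenius distance."""
    return 0.5 * float(np.linalg.norm(X - Y, "fro") ** 2)
```

Both divergences are nonnegative and vanish exactly when X = Y, the basic geometric property these nearness problems exploit.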


Low-Rank Kernel Learning with Bregman Matrix Divergences

In this paper, we study low-rank matrix nearness problems, with a focus on learning lowrank positive semidefinite (kernel) matrices for machine learning applications. We propose efficient algorithms that scale linearly in the number of data points and quadratically in the rank of the input matrix. Existing algorithms for learning kernel matrices often scale poorly, with running times that are c...


Efficient Bregman Range Search

We develop an algorithm for efficient range search when the notion of dissimilarity is given by a Bregman divergence. The range search task is to return all points in a potentially large database that are within some specified distance of a query. It arises in many learning algorithms such as locally-weighted regression, kernel density estimation, neighborhood graph-based algorithms, and in tas...


Information Geometry for Radar Target Detection with Total Jensen–Bregman Divergence

Abstract: This paper proposes a radar target detection algorithm based on information geometry. In particular, the correlation of sample data is modeled as a Hermitian positive-definite (HPD) matrix. Moreover, a class of total Jensen–Bregman divergences, including the total Jensen square loss, the total Jensen log-determinant divergence, and the total Jensen von Neumann divergence, are proposed...


Generalized Nonnegative Matrix Approximations with Bregman Divergences

Nonnegative matrix approximation (NNMA) is a recent technique for dimensionality reduction and data analysis that yields a parts based, sparse nonnegative representation for nonnegative input data. NNMA has found a wide variety of applications, including text analysis, document clustering, face/image recognition, language modeling, speech processing and many others. Despite these numerous appli...



Journal:

Volume   Issue

Pages  -

Publication date: 2012